
    Internet's Critical Path Horizon

    The Internet is known to display a highly heterogeneous structure and complex fluctuations in its traffic dynamics. Congestion seems to be an inevitable result of users' behavior coupled to the network dynamics, and its effects should be minimized by choosing appropriate routing strategies. But what routing depth is required in order to optimize traffic flow? In this paper we analyse the behavior of Internet traffic with a topologically realistic spatial structure, as described in a previous study (S.-H. Yook et al., Proc. Natl. Acad. Sci. USA 99 (2002) 13382). The model involves self-regulation of packet generation and different levels of routing depth. It is shown to reproduce the key statistical features of Internet traffic. Moreover, we also report the existence of a critical path horizon defining a transition from low-efficiency traffic to highly efficient flow. This transition is a direct consequence of the web's small-world architecture exploited by the routing algorithm: once routing tables reach the network diameter, traffic experiences a sudden transition from low-efficiency to highly efficient behavior. It is conjectured that routing policies might have spontaneously reached such a compromise in a distributed manner; the Internet would thus be operating close to this critical path horizon. (Comment: 8 pages, 8 figures. To appear in European Physical Journal B, 2004.)
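
    A minimal sketch of the routing-horizon mechanism described in this abstract, not the paper's actual model: the Watts-Strogatz small-world topology, the random forwarding rule beyond the horizon, and all parameters (n=200, h, max_hops) are illustrative assumptions. It reproduces the qualitative effect: mean delivery time drops sharply once the horizon h reaches the network diameter.

        import random
        import networkx as nx

        def deliver(G, src, dst, h, max_hops=1000):
            # Hop count needed to deliver one packet when routers only know
            # shortest routes up to h hops; beyond that, forward at random.
            node, hops = src, 0
            while node != dst and hops < max_hops:
                path = nx.shortest_path(G, node, dst)
                if len(path) - 1 <= h:               # destination inside horizon:
                    return hops + len(path) - 1      # follow the known route
                node = random.choice(list(G[node]))  # otherwise wander one hop
                hops += 1
            return hops

        random.seed(1)
        G = nx.connected_watts_strogatz_graph(200, 4, 0.1, seed=1)
        D = nx.diameter(G)
        for h in range(1, D + 2):
            pairs = [random.sample(list(G), 2) for _ in range(200)]
            mean = sum(deliver(G, s, t, h) for s, t in pairs) / len(pairs)
            print(f"h={h:2d}  mean hops={mean:6.1f}  (diameter={D})")

    The sharp drop in mean hops as h approaches D is the analogue of the "critical path horizon" transition; the exact shape of the curve here depends only on this toy topology.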

    The long and winding road: accidents and tinkering in software standardization

    Software is based on universal principles, but its development is not. Relating software to hardware is never automatic or easy. Attempts to optimize software production and drastically reduce its costs (as in hardware) have been very limited. Instead, highly skilled and experienced individuals are ultimately responsible for project success. The long and convoluted path towards useful and reliable software is often plagued by idiosyncratic accidents and emergent complexity. It was expected that software standardisation would remove these sources of unwanted diversity by aiming at controllable development processes, universal programming languages, and toolkits of reusable software components. However, the limited adoption of development standards suggests that we still do not understand why software is so difficult to produce. Software standardisation has been limited by our poor understanding of humans' role at the origin of technological diversity.

    A simple spatiotemporal evolution model of a transmission power grid

    In this paper, we present a model for the spatial and temporal evolution of a particularly large human-made network: the 400-kV French transmission power grid. The model is based on 1) an attachment procedure that diminishes the connection probability between two nodes as the network grows and 2) a coupled cost function characterizing the available budget at every time step. Two differentiated and consecutive processes can be distinguished: a first global space-filling process and a secondary local meshing process that increases connectivity at a local level. Results show that, even without power-system engineering design constraints (i.e., population and energy demand), the evolution of a transmission network can be remarkably well explained by means of a simple attachment procedure. Given a distribution of resources and a time span, the model can also be used to generate the probability distribution of cable lengths at every time step, thus facilitating network planning. Implications for the network's fragility are suggested as a starting point for new design perspectives on this kind of infrastructure.
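
    A toy sketch of the two ingredients named in this abstract, with every functional form assumed for illustration: an exponential distance penalty exp(-d/d0) stands in for the distance-dependent attachment probability, and a fixed per-step budget is paid down in cable length. The names and parameter values (d0, budget_per_step) are hypothetical, not taken from the paper.

        import math
        import random

        random.seed(2)
        nodes = [(0.5, 0.5)]           # (x, y) positions on the unit square
        edges = []                     # (i, j, cable_length)
        d0, budget_per_step = 0.2, 0.6

        for step in range(200):
            budget = budget_per_step   # resources available at this time step
            x, y = random.random(), random.random()
            new = len(nodes)
            nodes.append((x, y))
            # Try nearest existing nodes first; attach with a probability that
            # decays with distance, paying each cable's length out of the budget.
            for i in sorted(range(new), key=lambda j: math.dist(nodes[j], (x, y))):
                d = math.dist(nodes[i], (x, y))
                if d > budget:
                    break              # this step's budget is exhausted
                if random.random() < math.exp(-d / d0):
                    edges.append((i, new, d))
                    budget -= d

        lengths = [d for _, _, d in edges]
        print(f"{len(nodes)} nodes, {len(edges)} cables, "
              f"mean cable length {sum(lengths)/len(lengths):.3f}")

    Early in the run the budget buys long, space-filling links; later, when many nodes are nearby, the same budget buys several short local links, which is the qualitative two-phase behavior the abstract describes.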